Increasing and Decreasing Returns and Losses in Mutual Information Feature Subset Selection
Abstract
Mutual information between a target variable and a feature subset is extensively used as a feature subset selection criterion. This work contributes to a more thorough understanding of the evolution of the mutual information as a function of the number of features selected. We describe decreasing returns and increasing returns behavior in sequential forward search and increasing losses and decr...
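The returns behavior studied here can be made concrete with a greedy sequential forward search that records the marginal mutual-information gain at each step. The sketch below is illustrative only, not the paper's code: it assumes a small matrix of discrete features so that the joint mutual information I(S; Y) can be estimated from empirical counts, and all function names are hypothetical.

```python
import numpy as np
from collections import Counter

def joint_mi(X_subset, y):
    """Empirical I(S; Y) in bits for a subset of discrete features (columns of X_subset)."""
    n = len(y)
    joint = Counter(zip(map(tuple, X_subset), y))  # counts of (feature tuple, label)
    feat = Counter(map(tuple, X_subset))           # counts of feature tuples
    lab = Counter(y)                               # counts of labels
    mi = 0.0
    for (xs, c), cnt in joint.items():
        p_xy = cnt / n
        mi += p_xy * np.log2(p_xy / ((feat[xs] / n) * (lab[c] / n)))
    return mi

def forward_search(X, y, k):
    """Greedy forward selection of k features, tracking the MI gain per step."""
    selected, gains, prev = [], [], 0.0
    for _ in range(k):
        best_j, best_mi = None, -np.inf
        for j in range(X.shape[1]):
            if j in selected:
                continue
            mi = joint_mi(X[:, selected + [j]], y)
            if mi > best_mi:
                best_j, best_mi = j, mi
        selected.append(best_j)
        gains.append(best_mi - prev)  # marginal return of this step
        prev = best_mi
    return selected, gains
```

Plotting `gains` over the iterations would exhibit the kind of decreasing- or increasing-returns profiles the abstract analyzes.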
Similar Resources

Speeding Up Feature Subset Selection Through Mutual Information Relevance Filtering
A relevance filter is proposed which removes features based on the mutual information between class labels and features. It is proven that both feature independence and class-conditional feature independence are required for the filter to be statistically optimal. This is shown by establishing a relationship with the conditional relative entropy framework for feature selection. Removing f...
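As a rough illustration of such a relevance filter (not the authors' implementation), one can rank features by a per-feature MI estimate and discard the weak ones. `mutual_info_classif` is scikit-learn's estimator of I(X_j; Y); the threshold value here is an arbitrary placeholder.

```python
import numpy as np
from sklearn.feature_selection import mutual_info_classif

def mi_relevance_filter(X, y, threshold=0.01):
    """Keep only features whose estimated I(X_j; Y) meets the threshold."""
    relevance = mutual_info_classif(X, y)       # per-feature MI with the labels
    keep = np.where(relevance >= threshold)[0]  # indices of retained features
    return X[:, keep], keep
```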
On the Feature Selection Criterion Proposed in 'Gait Feature Subset Selection by Mutual Information'
Recently, Guo and Nixon [1] proposed a feature selection method based on maximizing I(x; Y), the multidimensional mutual information between the feature vector x and the class variable Y. Because computing I(x; Y) can be difficult in practice, Guo and Nixon proposed an approximation of I(x; Y) as the criterion for feature selection. We show that Guo and Nixon's criterion originates from ap...
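For context, the estimation difficulty comes from the standard chain-rule identity for multivariate mutual information; this identity is general and is not Guo and Nixon's specific derivation:

```latex
% Chain rule for the mutual information between a d-dimensional
% feature vector x and the class variable Y:
I(\mathbf{x}; Y) = \sum_{k=1}^{d} I\left(x_k; Y \mid x_1, \dots, x_{k-1}\right)
```

Each successive term conditions on a higher-dimensional set of features, so direct estimation degrades quickly with d, which is what motivates low-order (e.g., pairwise) approximations of I(x; Y).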
Searching optimal feature subset using mutual information
A novel feature selection methodology is proposed based on the concept of mutual information. The proposed methodology effectively circumvents two major problems in the feature selection process: identifying irrelevant and redundant features in the feature set, and estimating the optimal feature subset for the classification task.
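One widely used way to operationalize the relevance/redundancy trade-off with pairwise MI terms is an mRMR-style greedy score. The sketch below follows that generic recipe under scikit-learn's MI estimators; it is not necessarily the methodology this abstract proposes.

```python
import numpy as np
from sklearn.feature_selection import mutual_info_classif, mutual_info_regression

def mrmr_select(X, y, k):
    """Greedy max-relevance, min-redundancy selection of k feature indices."""
    relevance = mutual_info_classif(X, y)  # estimates I(X_j; Y) per feature
    selected = [int(np.argmax(relevance))]
    while len(selected) < k:
        best_j, best_score = None, -np.inf
        for j in range(X.shape[1]):
            if j in selected:
                continue
            # mean pairwise redundancy of candidate j with selected features
            redundancy = np.mean([mutual_info_regression(X[:, [s]], X[:, j])[0]
                                  for s in selected])
            score = relevance[j] - redundancy
            if score > best_score:
                best_j, best_score = j, score
        selected.append(best_j)
    return selected
```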
Quadratic Mutual Information Feature Selection
We propose a novel feature selection method based on quadratic mutual information which has its roots in the Cauchy–Schwarz divergence and Rényi entropy. The method uses the direct estimation of quadratic mutual information from data samples using Gaussian kernel functions, and can detect second-order non-linear relations. Its main advantages are: (i) unified analysis of discrete and continuous dat...
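A minimal sketch of a Cauchy–Schwarz quadratic mutual information estimator with Gaussian (Parzen) kernels is given below, in the spirit of the method described above. The fixed bandwidth `sigma` and the treatment of the labels as a plain numeric vector are simplifying assumptions, not the authors' exact estimator.

```python
import numpy as np

def gram(v, sigma):
    """Kernel matrix G_ij from Parzen windows of width sigma convolved pairwise:
    G_ij = exp(-(v_i - v_j)^2 / (4 sigma^2))."""
    d = v[:, None] - v[None, :]
    return np.exp(-d ** 2 / (4 * sigma ** 2))

def qmi_cs(x, y, sigma=0.5):
    """Cauchy-Schwarz QMI between 1-D sample vectors x and y."""
    Kx, Ky = gram(x, sigma), gram(y, sigma)
    v_joint = np.mean(Kx * Ky)                    # ~ integral of p(x, y)^2
    v_marg = np.mean(Kx) * np.mean(Ky)            # ~ integral of (p(x) p(y))^2
    v_cross = np.mean(Kx.mean(axis=1) * Ky.mean(axis=1))  # ~ cross term
    # Nonnegative at the population level; near zero when x and y are independent.
    return np.log(v_joint * v_marg / v_cross ** 2)
```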
Journal
Journal title: Entropy
Year: 2010
ISSN: 1099-4300
DOI: 10.3390/e12102144